Race conditions can occur when applications process requests concurrently (multi-threaded) without proper protection, resulting in unintended behavior. Timing attacks and race conditions can lead to data exposure, data loss, or business-logic bypasses.
Race Conditions
A race condition attack causes intentional collisions using carefully timed requests. Race conditions are vulnerabilities that occur because the sequence and timing of concurrent operations influence the outcome in an unexpected way.
Multi-threaded programs are often more vulnerable because threads share the same memory and resources without a guaranteed execution order.
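As a minimal illustration (pure Python, with an artificial delay to widen the race window), two concurrent withdrawals can both pass the balance check before either one deducts:

```python
import threading
import time

balance = 100     # shared account balance, no lock protecting it
withdrawn = []    # record of successful withdrawals

def withdraw(amount):
    global balance
    if balance >= amount:        # check
        time.sleep(0.2)          # artificial delay to widen the race window
        balance -= amount        # act: not atomic with the check
        withdrawn.append(amount)

# two concurrent withdrawals of 100 from a balance of 100:
# both pass the check before either one deducts
threads = [threading.Thread(target=withdraw, args=(100,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(f"withdrew {sum(withdrawn)} from an initial balance of 100")
```

Both withdrawals succeed, so 200 is withdrawn from an account that held 100. The same check-then-act gap exists in any of the functions in the table above when the check and the update are separate operations.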
Some examples
| Category | Function | Race Condition Risk |
|---|---|---|
| Payments | Balance deduction, checkout | High |
| Promotions | Coupon/promo code redemption | High |
| Gift cards | Balance usage | High |
| Rate limiting | Login attempts, OTP validation | High |
| Inventory | Stock checks, "last item" | High |
| Authentication | Password reset tokens, 2FA | Medium |
| Registration | Email uniqueness check | Medium |
| Sessions | Session creation/invalidation | Medium |
| File handling | Upload processing, virus scan | Medium |
| Cache | Read-miss-fetch-write cycles | Medium |
| Loyalty points | Earn/redeem points | High |
| Referral systems | Referral bonus claiming | High |
Limit overrun race conditions (TOCTOU)
When you can exceed a limit imposed by the application, it is called a limit overrun race condition, a form of time-of-check to time-of-use (TOCTOU) flaw. An example:
- A loyalty card adds points after each purchase
- Redeem the points from the card
- On a new purchase, redeem the points again, but now split across 2 parallel requests
- Observe that you receive 2x the points
Variations of this kind of attack:
- Redeeming a gift card multiple times
- Rating a product multiple times
- Withdrawing or transferring cash in excess of your account balance
- Reusing a single CAPTCHA solution
- Bypassing an anti-brute-force rate limit
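The common flaw behind all of these variations is a check-then-act that is not atomic. A minimal sketch of the fix in Python is to make the check and the act a single atomic step, here with a lock (in a real web application this would typically be an atomic conditional database update or a row lock instead):

```python
import threading
import time

balance = 100
withdrawn = []
lock = threading.Lock()

def withdraw(amount):
    global balance
    with lock:                   # check and act are now one atomic step
        if balance >= amount:
            time.sleep(0.2)      # same artificial delay: the race no longer matters
            balance -= amount
            withdrawn.append(amount)

threads = [threading.Thread(target=withdraw, args=(100,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(f"withdrew {sum(withdrawn)}, balance is now {balance}")
```

Now only the first withdrawal succeeds; the second one observes the updated balance and is rejected.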
Custom Actions
Using custom actions in Burp Repeater, we can easily test race conditions with the script below:
```java
int NUMBER_OF_REQUESTS = 20;

// duplicate the current request 20 times
var reqs = new ArrayList<HttpRequest>();
for (int i = 0; i < NUMBER_OF_REQUESTS; i++) {
    reqs.add(requestResponse.request());
}

// send the duplicates and log the resulting status codes
var responses = api().http().sendRequests(reqs);
var codes = responses.stream().map(HttpRequestResponse::response).map(HttpResponse::statusCode).toList();
logging().logToOutput(codes);
```

Bypassing rate limiting in login form
When testing rate-limiting protection, for example on a login form, sometimes 50 requests can make the difference. https://medium.com/@syedshorox27/25000-from-login-bypassed-mfa-using-a-race-condition-jwt-leak-6139fcc22573
- Check whether the rate limiting is session-based or not; test different usernames.
- Observe whether the rate limit is username-based, for example kicking in after 3 attempts.
- If so, this tells us the actual number of failed attempts is stored server-side.
- The server increments a failed-login counter linked to the username.
Sequential vs Parallel Requests
Now that we know the server is incrementing a counter, we can try some race conditions. Using Burp Repeater we can duplicate a request 20 times and send the group either sequentially or in parallel.
| Mode | How Requests Are Sent | Overlap? |
|---|---|---|
| Sequential | One request finishes before next | ❌ No |
| Parallel | Multiple requests sent at same time | ✅ Yes |
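The two modes can be sketched in Python. `send_login` is a placeholder for the real HTTP call (e.g. a POST to the login endpoint); the barrier releases all workers at once to maximize overlap:

```python
import threading
from concurrent.futures import ThreadPoolExecutor

def send_login(password):
    # placeholder for the real HTTP request, e.g. requests.post(login_url, ...)
    return f"attempt:{password}"

def send_sequential(passwords):
    # each request finishes before the next one starts: no overlap
    return [send_login(p) for p in passwords]

def send_parallel(passwords):
    # all workers block on the barrier, then fire at (nearly) the same instant
    barrier = threading.Barrier(len(passwords))

    def gated(password):
        barrier.wait()
        return send_login(password)

    with ThreadPoolExecutor(max_workers=len(passwords)) as pool:
        return list(pool.map(gated, passwords))

print(send_parallel(["hunter2", "letmein", "passw0rd"]))
```

Burp's "Send group in parallel" does the equivalent at the network level (last-byte synchronization over HTTP/1.1, or a single-packet attack over HTTP/2).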
Observe the responses and note how they differ, whether in error messages or other details. If, for example, 3 attempts are allowed and we are fast enough, we may be able to attempt more than 3. Using Turbo Intruder we can queue a login request for each password stored in our clipboard:
```python
def queueRequests(target, wordlists):
    # as the target supports HTTP/2, use engine=Engine.BURP2 and concurrentConnections=1 for a single-packet attack
    engine = RequestEngine(endpoint=target.endpoint,
                           concurrentConnections=1,
                           engine=Engine.BURP2
                           )

    # assign the list of candidate passwords from your clipboard
    passwords = wordlists.clipboard

    # queue a login request using each password from the wordlist;
    # the 'gate' argument withholds the final part of each request until engine.openGate() is invoked
    for password in passwords:
        engine.queue(target.req, password, gate='1')

    # once every request has been queued,
    # invoke engine.openGate() to send all requests in the given gate simultaneously
    engine.openGate('1')


def handleResponse(req, interesting):
    table.add(req)
```
PHP Session Files and File Locks
When trying race conditions against PHP applications, we must be aware of PHP session locks. PHP locks the session file as soon as a script calls session_start(), to prevent multiple requests from writing to the file at the same time. As a result it processes only 1 request at a time; other requests within the same session must wait until the first request finishes.
How to bypass?
We can use different sessions by logging in many times and recording the session IDs. We then assign each request a different session ID, so each thread accesses a different session file and no request has to wait for a file lock.
Turbo Intruder script
```python
def queueRequests(target, wordlists):
    engine = RequestEngine(endpoint=target.endpoint,
                           concurrentConnections=30,
                           requestsPerConnection=100,
                           pipeline=False
                           )

    # the 'gate' argument blocks the final byte of each request until openGate is invoked
    for sess in ["p5b2nr48govua1ieljfdecppjg", "48ncr9hc1rjm361fp7h17110ar", "0411kdhfmca5uqiappmc3trgcg", "m3qv0d1qu7omrtm2rooivr7lc4", "onerh3j83jopd5ul8scjaf14rr"]:
        engine.queue(target.req, sess, gate='race1')

    # wait until every 'race1' tagged request is ready,
    # then send the final byte of each request
    # (this method is non-blocking, just like queue)
    engine.openGate('race1')

    engine.complete(timeout=60)


def handleResponse(req, interesting):
    table.add(req)
```
Hidden multi-step sequences
It's possible for MFA to perform multiple steps within one request flow. For example, the user could already be in an active logged-in state while later steps still need to complete, which amounts to a basic MFA bypass.
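A hedged sketch of how such a flaw can look server-side (all names hypothetical): the session is marked authenticated as soon as the password checks out, before the MFA step has been verified, leaving a window in which authenticated endpoints already respond:

```python
sessions = {}

def password_step(session_id, password_ok):
    if password_ok:
        # flaw: 'authenticated' is set before the MFA step completes
        sessions[session_id] = {"authenticated": True, "mfa_pending": True}

def mfa_step(session_id, code_ok):
    if code_ok:
        sessions[session_id]["mfa_pending"] = False

def profile(session_id):
    s = sessions.get(session_id, {})
    # vulnerable check: only looks at 'authenticated', ignores 'mfa_pending'
    return "profile data" if s.get("authenticated") else "denied"

password_step("s1", password_ok=True)
print(profile("s1"))  # profile data is served before MFA was ever completed
```

Racing a request to an authenticated endpoint against the MFA validation step exploits exactly this window.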
Predict
For a successful collision you need two or more requests that operate on the same record, for example a password reset operation. And of course we need to find the critical endpoints with critical functionality.
Probe
To compare results we need a baseline based on normal conditions. Send about 10 requests in sequence and examine the responses for differences. Then send them in parallel and check the responses again. Look for what stands out, what is different.
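A small helper for this probing step, using made-up (status, body-length) pairs purely as illustration: summarize each run and look for outliers in the parallel run that the sequential baseline never produced:

```python
from collections import Counter

def summarize(responses):
    # count identical (status, body_length) pairs; the rare entries are the leads
    return Counter(responses)

# hypothetical data: 10 sequential baseline responses vs 10 parallel ones
sequential = [(429, 312)] * 10
parallel = [(429, 312)] * 8 + [(200, 1874)] * 2

print(summarize(sequential))
print(summarize(parallel))  # the two (200, 1874) outliers stand out against the baseline
```

Anything that only appears under parallel load, a different status code, body length, error message, or timing, is a candidate race window worth proving.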
Prove
After understanding the responses, apply race attacks to exploit the vulnerabilities. Some of them can be really advanced and take a lot of research to find and get right.
Multi-endpoint race conditions
A multi-endpoint race condition involves sending requests to multiple different endpoints at the same time. For example, we can try to pay for an order and, before the order is confirmed, use the race window to add more items to it.
This applies when payment validation and order confirmation are performed during the processing of a single request. In that case we can potentially add more items to the basket in the race window while the payment is being validated. However, we can run into issues lining up the race window because of:
- Delays caused by the network architecture
- Delays caused by different endpoints varying in response times
Connection warming
With parallel requests, the back-end servers do not always stay in sync. The first request over a connection is slower because the connection is still being established. This delay is normal and not related to the endpoint you are testing. By first sending a GET / request to the homepage we can warm up the connection.
- Before the race, add the homepage request at the start of the request group
- Send all requests using "Send group in sequence (single connection)"
If the first request is slow but the following requests are fast, the delay was only due to the connection setup.